In the face of shrinking process geometries and rising clock frequencies, designers must cope with escalating crosstalk noise effects that dramatically impact the timing of complex digital circuits. Circuits that appear to pass timing analysis with existing EDA tools can fail in silicon because of crosstalk and other signal-integrity effects. As complex IC designs migrate to ultra-deep-submicron (UDSM) process technologies, 0.18 micron and below, designers need noise-aware timing analysis methods to address the critical impact of noise on timing convergence for UDSM system-on-a-chip (SOC) designs.
For timing signoff, IC designers typically perform timing verification using static timing analysis. Unfortunately, existing static timing tools ignore one of the key UDSM electrical side effects: crosstalk, which is created by the simultaneous switching of neighboring wires. Crosstalk can dramatically affect the integrity of switching signals by distorting their waveform shapes and altering their time of flight. If these electrical effects aren't handled during timing analysis, circuits that pass traditional timing analysis will often fail when manufactured in silicon. Designers using UDSM process technologies are finding that they need additional design iterations after manufacturing because measured silicon performance differs significantly from timing-tool predictions. This failure to achieve design closure has become one of the key factors in SOC product-release delays.
Unfortunately, the impact of noise and signal-integrity effects will only grow with increasingly sophisticated UDSM processes and current SOC trends. With each new generation of UDSM process technology, feature sizes, wire widths, and wire spacings continue to shrink. Yet die sizes remain relatively constant, because in this age of systems on silicon, designers use the extra space to squeeze more function into SOCs. As a result, average wire length has remained relatively constant despite the decreasing pitch.
Additionally, each reduction in wire width means a decrease in total wire capacitance, with a corresponding increase in the fraction of wire capacitance contributed by lateral coupling. Further exacerbating these problems, the continued demand for improved performance translates into higher clock frequencies and much faster switching signals. The faster a signal switches, the more noise it couples onto neighboring lines. Consequently, in this environment of ultra-dense, high-speed SOCs, measuring the impact of crosstalk is critical to determining whether a design will function as expected at the required performance levels.
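As a rough back-of-the-envelope illustration of this shift (the capacitance values below are invented for illustration, not process data), the fraction of total wire capacitance due to lateral coupling can grow even as the total falls:

```python
# Illustrative (made-up) per-unit-length capacitances showing how the coupling
# fraction grows as wire width and spacing shrink; not measured process data.
old = {"area_cap": 2.0, "coupling_cap": 1.0}   # fF per unit length, wider/looser wires
new = {"area_cap": 1.2, "coupling_cap": 1.6}   # fF per unit length, narrower/tighter wires

for label, c in (("old node", old), ("new node", new)):
    total = c["area_cap"] + c["coupling_cap"]
    fraction = c["coupling_cap"] / total
    print(f"{label}: total {total:.1f} fF, coupling fraction {fraction:.0%}")
# old node: total 3.0 fF, coupling fraction 33%
# new node: total 2.8 fF, coupling fraction 57%
```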
Crosstalk effects
In the example shown in Figure 1, the additional delay caused by crosstalk is greater than the delay of the gate without crosstalk. This example isn't unusual; crosstalk commonly changes the delay of a signal by more than 100 percent. If the affected signal forms part of a critical maximum-delay path, the extra delay due to crosstalk can result in a setup failure, where the signal arrives late at a latch or flip-flop.
Crosstalk can also decrease delays (see Figure 2). If this noise event occurs on a critical minimum-delay path, it can lead to a hold violation, where data arrives early at a latch or flip-flop. Setup failures are of course undesirable, but they can be removed by slowing down the clock, at the cost of lower overall SOC performance. Hold violations, in contrast, can only be fixed by silicon mask changes, at a tremendous cost in dollars and lost time.
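To make these failure modes concrete, here is a minimal sketch with simplified setup and hold checks and hypothetical numbers (not taken from any real design), showing how a crosstalk delay delta can flip a passing path into a failing one:

```python
# Simplified setup/hold slack checks with a crosstalk-induced delay delta.
# All numbers are hypothetical; units are nanoseconds.

def setup_slack(clock_period, path_delay, t_setup):
    # Data must arrive t_setup before the capturing clock edge.
    return clock_period - (path_delay + t_setup)

def hold_slack(path_delay, t_hold):
    # Data must stay stable for t_hold after the clock edge.
    return path_delay - t_hold

period, t_setup, t_hold = 5.0, 0.3, 0.2
max_path, min_path = 4.5, 0.25           # noise-free max/min path delays
delta_slow, delta_fast = 0.4, -0.1       # crosstalk slows down or speeds up the victim

print(setup_slack(period, max_path, t_setup))               # +0.2 ns: passes without noise
print(setup_slack(period, max_path + delta_slow, t_setup))  # -0.2 ns: setup failure
print(hold_slack(min_path, t_hold))                         # +0.05 ns: passes without noise
print(hold_slack(min_path + delta_fast, t_hold))            # -0.05 ns: hold violation
```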
To account accurately for crosstalk effects on delay, designers need special techniques to deal with the volume of parasitic data present in UDSM designs, the non-monotonic waveform shapes that occur when aggressor signals switch, and the complex timing and logic relationships between aggressors and victims. Crosstalk-induced delay analysis starts with identifying the potential victim nets: nets that have enough coupling capacitance to warrant investigation. Using advanced interconnect analysis techniques, crosstalk analysis tools such as CeltIC from CadMOS (San Jose, CA) handle this step by analyzing a design to find those nets whose glitch noise exceeds a particular threshold. Next, the tool calculates the typical delay for each victim net by simulating its drivers and receivers with all possible aggressor nets held quiet. The tool then switches the aggressor nets in the same direction as the victim net to calculate the decrease in minimum delay, and in the opposite direction to calculate the increase in maximum delay.
The measure of crosstalk's impact on delay is therefore the difference between the delay in the typical case and the delays in the minimum- and maximum-delay cases. These calculations can be significantly influenced by a number of factors, including the slew rates of the victim and aggressor signals and the alignment of the aggressor switching relative to the victim switching (see Figures 3 and 4).
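The flow just described can be summarized in a short sketch. The outline below is a simplified, hypothetical rendering of that flow; the callback names and threshold value are placeholders, not the API of CeltIC or any other tool:

```python
# Simplified, hypothetical outline of crosstalk-induced delay analysis.
# glitch_noise() and simulate_delay() stand in for a real circuit simulator.

NOISE_THRESHOLD = 0.25  # example glitch-noise threshold, as a fraction of Vdd

def analyze_crosstalk_delay(nets, glitch_noise, simulate_delay):
    """Return (min_delta, max_delta) per victim net."""
    results = {}
    # Step 1: keep only nets whose coupled glitch noise is large enough to matter.
    victims = [net for net in nets if glitch_noise(net) > NOISE_THRESHOLD]
    for victim in victims:
        typical = simulate_delay(victim, "quiet")               # aggressors held quiet
        fastest = simulate_delay(victim, "same_direction")      # aggressors aid the victim
        slowest = simulate_delay(victim, "opposite_direction")  # aggressors oppose the victim
        # Deltas relative to the quiet case measure crosstalk's impact on delay.
        results[victim] = (fastest - typical, slowest - typical)
    return results

# Toy usage with canned values standing in for simulation results (ns).
delays = {("n1", "quiet"): 1.0, ("n1", "same_direction"): 0.75,
          ("n1", "opposite_direction"): 1.5}
print(analyze_crosstalk_delay(["n1"],
                              glitch_noise=lambda net: 0.30,
                              simulate_delay=lambda net, mode: delays[(net, mode)]))
# {'n1': (-0.25, 0.5)}
```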
The task of predicting worst-case "noise-on-delay" effects becomes even more complicated when a victim net has multiple aggressors with varying coupling capacitances. Typically, the worst-case crosstalk occurs when all aggressors switch in the same direction so that their noise peaks align. Because of signal arrival times and logical exclusivity between signals, however, this worst-case situation may never happen. To avoid overestimating crosstalk effects, analysis tools account for the windows of time during which signals are likely to switch when grouping aggressor signals together.
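One way to picture this window-based filtering is the minimal sketch below (illustrative window values, not any particular tool's algorithm): an aggressor contributes to the worst-case group only if its switching window overlaps the victim's.

```python
# Minimal sketch of window-based aggressor filtering.
# Windows are (earliest, latest) switching times in ns; values are illustrative.

def windows_overlap(a, b):
    # Two switching windows overlap if neither ends before the other begins.
    return a[0] <= b[1] and b[0] <= a[1]

def timing_filtered_aggressors(victim_window, aggressor_windows):
    # Only aggressors that can switch while the victim switches contribute
    # to the worst-case noise-on-delay group.
    return [name for name, window in aggressor_windows.items()
            if windows_overlap(victim_window, window)]

victim = (2.0, 3.0)
aggressors = {"a1": (1.5, 2.5), "a2": (4.0, 5.0), "a3": (2.8, 3.6)}
print(timing_filtered_aggressors(victim, aggressors))  # ['a1', 'a3']
```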
The intelligence factor
Although tools can intelligently factor switching windows and logic relationships into calculations, it's still possible to overestimate noise-on-delay effects. For example, if a victim signal runs orthogonal to a multi-bit bus, a small amount of coupling capacitance occurs between the victim and bus at each crossing point.
Individually, each of these small couplings is insignificant, but their combined effect can't be ignored. In this case, the worst-case crosstalk scenario occurs when all bits of the bus switch simultaneously in the same direction; unless the bus is explicitly designed to do so, the probability of this happening is typically very low. An approach that assumed simultaneous switching would therefore generate a very pessimistic noise-on-delay result. Instead, many tools account for the statistical probability of simultaneous switching when calculating noise-on-delay effects.
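The sketch below illustrates the idea with a simple independence model and invented numbers; it is not any particular tool's noise model, just a way to see how statistical derating tempers the pessimistic all-bits-switch assumption.

```python
# Minimal sketch of statistically derating bus-crossing coupling.
# Probabilities, capacitances, and the independence assumption are illustrative.

n_bits = 32            # victim crosses a 32-bit bus
c_per_crossing = 0.5   # coupling capacitance per crossing, fF (example value)
p_switch_same = 0.25   # chance a given bit switches in the worst direction in the window

# Pessimistic model: every bit switches together against the victim.
worst_case_coupling = n_bits * c_per_crossing

# Probability that all bits actually do so, assuming independent bits.
p_all_together = p_switch_same ** n_bits

# Statistical model: coupling weighted by the expected number of switching bits.
expected_coupling = n_bits * p_switch_same * c_per_crossing

print(f"worst case: {worst_case_coupling:.1f} fF "
      f"(probability {p_all_together:.1e}), expected: {expected_coupling:.1f} fF")
# worst case: 16.0 fF (probability 5.4e-20), expected: 4.0 fF
```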
Crosstalk delay clearly depends on signal arrival times and slew rates. Consequently, a correct analysis of a circuit's true timing behavior demands that crosstalk delay effects be incorporated into timing analysis. Given this mutual interdependence of crosstalk and timing, correct timing verification requires a method that accounts for both effects together.
The most promising approach is noise-aware timing analysis, which can alternate between crosstalk noise analysis and timing analysis, feeding the results of each analysis to the other.
This iterative approach starts with an assumption of worst-case conditions and progressively refines the results. Designers first find worst-case crosstalk effects on delay by assuming the fastest possible switching on all aggressor nets and ignoring timing relationships between aggressor and victim nets. They next apply this worst-case noise result to the worst-case noise-free timing to obtain the worst-case timing including noise. Designers can use these results to generate more accurate switching windows, which in turn feed the next iteration of the noise calculation, and so on until the analysis converges, typically within only a few iterations.
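The control flow of this loop can be sketched as follows. Only the alternation and convergence test reflect the approach described above; noise_analysis() and timing_analysis() are placeholders for real tools, and the failing-path counts in the toy run are canned values, not results from any design.

```python
# Hypothetical sketch of the iterative noise-aware timing loop.
# noise_analysis() and timing_analysis() stand in for real tools.

def noise_aware_timing(noise_analysis, timing_analysis, max_iterations=10):
    # Iteration 1 assumes worst case: every aggressor may switch at any time.
    windows = None
    previous_failures = None
    for iteration in range(1, max_iterations + 1):
        delay_deltas = noise_analysis(windows)               # crosstalk deltas per net
        failures, windows = timing_analysis(delay_deltas)    # failing paths + refined windows
        if failures == previous_failures:
            return iteration, failures                       # converged
        previous_failures = failures
    return max_iterations, previous_failures

# Toy run with canned failing-path counts; the loop stops once the count stabilizes.
fail_counts = iter([11, 10, 10])
print(noise_aware_timing(lambda w: {}, lambda d: (next(fail_counts), {})))  # (3, 10)
```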
Video processor
This noise-aware timing analysis approach was used to analyze a block from a 0.18-micron video processor. The block comprised 30K placed instances (420K transistors), 32K logical nets (990K physical nodes), and 3M parasitic elements (2.5M of which were coupling capacitors). The block met all timing goals as determined by timing analysis performed without accounting for noise effects. After the first iteration of noise analysis, 12.5 percent of the nets were found to be impacted by noise, with delay deltas ranging from 100 ps to 260 ps. When these incremental delay changes were fed to static timing analysis, 11 paths had setup failures, and the worst path had a negative slack of 670 ps. Using timing windows and transition times from the first static-timing iteration, the second iteration of noise analysis reported that 6.25 percent of all nets had their delays impacted by crosstalk. Feeding these delay changes to static timing yielded 10 failing paths. The process converged on the third iteration, which also gave 10 failing paths.
As these results demonstrate, crosstalk effects significantly impact the timing of digital circuits, and the impact will rise dramatically as designers migrate to smaller process technologies and faster clock frequencies. Indeed, noise effects can easily change timing results by more than 20 percent. As a result, noise-aware timing analysis has become essential for achieving accurate timing convergence in complex, high-speed SOCs before silicon manufacturing.
Charlie Huang is the co-founder, chairman, and CEO of CadMOS Technologies (San Jose, CA). Before co-founding CadMOS, he was vice president of R&D of the EPIC Technology Group at Synopsys, where he was responsible for developing PowerMill and, subsequently, PathMill. He holds a US patent on piecewise-linear event-driven simulation.